
Add assemble-sdks CI job and add Termux packages necessary for XML and networking #167


Merged: 1 commit, Oct 7, 2024

Conversation

marcprux
Contributor

This PR adds a workflow job that runs at the end of the CI cycle. It takes the artifacts from the build-sdk-and-tests job for each version (release, devel, and trunk) and aggregates them into a single SDK archive. This is the start of addressing #163, but it does not yet generate a toolchain.json file or an .artifactbundle.

You can see the artifacts that it will generate at https://github.com/marcprux/swift-android-sdk/actions/runs/10781519718, which creates swift-5.10.1-RELEASE-android-sdk.tar.xz, swift-6.0-DEVELOPMENT-SNAPSHOT-2024-07-19-a-android-sdk.tar.xz, and swift-DEVELOPMENT-SNAPSHOT-2024-07-15-a-android-sdk.tar.xz. These archives can be used with a custom destination JSON file (not included, since it will need to reference the local NDK and host toolchain locations, which will vary from machine to machine).

In addition, this PR adds the following Termux packages to the SDK, which enable FoundationXML and FoundationNetworking: zlib, libnghttp3, libnghttp2, libssh2, openssl, liblzma, and libiconv. With these included, XMLParser and URLSession are working (although not HTTPS yet; we need to be able to hook into the Android certificate store somehow).
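As background (not part of this PR's diff), on non-Darwin platforms XMLParser and URLSession live in the separate FoundationXML and FoundationNetworking modules, so client code typically imports them conditionally; a minimal sketch:

```swift
import Foundation
// On Linux/Android, the XML and networking APIs are split out of core Foundation:
#if canImport(FoundationXML)
import FoundationXML
#endif
#if canImport(FoundationNetworking)
import FoundationNetworking
#endif

let data = Data("<root><child/></root>".utf8)
let parser = XMLParser(data: data)
// parse() returns true when the document is well-formed
print(parser.parse())
```

On Darwin the `canImport` checks simply evaluate to false, so the same source builds unchanged there.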

@marcprux
Contributor Author

I think the CI error is due to #169. Manually re-running the failed job should make it pass.

@marcprux
Contributor Author

marcprux commented Sep 24, 2024

I've updated this PR with the latest 6.0 work, and the resulting swift-6.0-RELEASE-android-sdk.tar.xz artifact works well from my initial experimentation.

A couple of observations:

  1. libc++_shared.so no longer seems to be included in /usr/lib. Not sure what changed (it was included in the 5.10 SDK), but for now it needs to be manually copied from the NDK along with the other shared objects to get a test executable to run on Android.
  2. swift-testing is not being built. I am currently experimenting with adding it to the build-script flags in https://github.com/marcprux/swift-android-sdk/pull/11

```yaml
- id: download-artifacts
  uses: actions/download-artifact@v4
  with:
    pattern: swift-android-sdk-${{ matrix.version }}-*
```
Owner


I stopped uploading the single-arch SDKs: get them from the cache instead (note the new key) and we can just upload the three assembled multi-arch SDKs at the end.

Contributor Author


Will do!

Contributor Author


What am I doing wrong here? https://github.com/marcprux/swift-android-sdk/actions/runs/11022198041/job/30610978833#step:8:13 says:

Cache saved with key: swift-6.0-RELEASE-armv7-ndk-27b-sdk

But then https://github.com/marcprux/swift-android-sdk/actions/runs/11022198041/job/30613448494#step:3:9 says:

Error: Failed to restore cache entry. Exiting as fail-on-cache-miss is set. Input key: swift-6.0-RELEASE-armv7-ndk-27b-sdk

Is cache access restricted to the same job that created it? It was working fine when I was downloading the uploaded artifact.

Contributor Author


I've reverted to using upload-artifact/download-artifact to get the SDKs to the assemble-sdks job, but happy to entertain any alternative suggestions you might have for accessing the cache files.
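For reference, a restore-only step using the actions/cache/restore sub-action would look roughly like this; the key is taken from the log output above, but the path is an assumption and must match whatever path the build job passed to the save step:

```yaml
- name: Restore single-arch SDK from cache
  uses: actions/cache/restore@v4
  with:
    # must match the exact key the build job saved under
    key: swift-6.0-RELEASE-armv7-ndk-27b-sdk
    # hypothetical path; must match the path used when saving
    path: swift-6.0-RELEASE-android-armv7-sdk
    fail-on-cache-miss: true
```

With `fail-on-cache-miss: true`, the job stops immediately on a miss instead of building against a missing SDK.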

Owner


> What am I doing wrong here?

You're using the wrong path; compare it to the one I used above. Now that it's just a restore action, maybe you can leave off the path altogether.

> Is cache access restricted to the same job that created it?

Of course not: I specifically created an initial toolchain job to populate the caches for the next jobs that actually build the code.

Sounds like you were just working too late; the same thing happened to me with the issues I had working around the latest CMake 3.30. 😉

Contributor Author


BTW, I wonder if we should be including the hashes of the build flags and support files in the cache key (as described at https://github.com/actions/cache#creating-a-cache-key). For example, I added "swift-testing" to get-packages-and-swift-source.swift, but since changes to that file don't invalidate the cache, the stale cache files will be used until they are deleted (or age out), resulting in a cached SDK that doesn't reflect the most recent build configuration.
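A hashFiles-based key along those lines might look like the following; the step name and cache path are illustrative, not taken from this repo's workflow:

```yaml
- name: Cache the SDK
  uses: actions/cache@v4
  with:
    path: sdk   # hypothetical cache path
    # any change to the build script produces a new key, invalidating stale SDKs
    key: swift-${{ matrix.version }}-sdk-${{ hashFiles('get-packages-and-swift-source.swift') }}
```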

Owner


It's a good idea, but since the files in this repo aren't changed that often while the cached SDKs are continually rebuilt because of new Swift snapshot tags, I haven't bothered.

Contributor Author


An alternative might be to suffix the cache ID with the PR identifier (if any, e.g. ${{ github.event.pull_request.number || github.event.issue.number }}), which would compartmentalize the cached files for a given PR. This would trigger an SDK rebuild for each pull request (which I want), with the additional benefit that a PR's build can't accidentally "poison" the canonical main cache if it happens to run between when a new Swift devel/trunk/release version becomes available and the daily action on main runs.
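A sketch of such a key, with an assumed 'main' fallback for scheduled runs (the `||` chain degrades to the fallback when neither event field is present):

```yaml
with:
  # per-PR compartment; scheduled runs on main fall back to a shared key
  key: swift-${{ matrix.version }}-sdk-${{ github.event.pull_request.number || github.event.issue.number || 'main' }}
```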

Contributor Author


As an example of why this would be useful: I would like #176 to have triggered a rebuild of the SDK, because it includes changes to get-packages-and-swift-source.swift, but the action simply reuses the cached SDK and thereby skips exercising the changes altogether.

Owner

@finagolfin finagolfin Oct 2, 2024


I actually like the current behavior of reusing the cached SDKs as much as possible, as it takes almost an hour on the CI to rebuild each SDK, so for most changes that don't require rebuilding the SDK, this is the appropriate default. I simply have to remember to change the cache key anytime I want the SDKs rebuilt, which hasn't been a problem so far. Btw, the PR cache is separate from the scheduled CI cache, so if something is added to the cache in this pull, it won't poison the regular cache.

I suggest you set the cache key in a single place in this pull, say in the first step where the snapshot tag is currently set, write it to $GITHUB_OUTPUT and then use it in all four places where the SDK cache key is now checked after this pull. That way, we can manually change that key string in one place, rather than having to update four.

If you'd like, also add the hash for the Swift script and 11 patches applied to the SDK source to the SDK cache key, so you don't have to manually bump the key each time. You'll have to move cloning this repo in the CI to the first step, so you can hash those files in the now-second step.
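The single-place key could be wired up roughly like this; the job, step, and output names are hypothetical, and the hashed file list follows the suggestion above:

```yaml
jobs:
  get-latest-toolchain:
    runs-on: ubuntu-24.04
    outputs:
      sdk-cache-key: ${{ steps.sdk-key.outputs.key }}
    steps:
      - uses: actions/checkout@v4   # clone first so repo files can be hashed
      - id: sdk-key
        run: echo "key=sdk-${{ hashFiles('get-packages-and-swift-source.swift', '*.patch') }}" >> $GITHUB_OUTPUT

  build-sdk-and-tests:
    needs: get-latest-toolchain
    runs-on: ubuntu-24.04
    steps:
      - uses: actions/cache@v4
        with:
          path: sdk   # hypothetical cache path
          key: ${{ needs.get-latest-toolchain.outputs.sdk-cache-key }}
```

Every downstream job then references `needs.get-latest-toolchain.outputs.sdk-cache-key`, so a manual bump touches exactly one line.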

@marcprux marcprux marked this pull request as draft September 24, 2024 21:12
@marcprux marcprux marked this pull request as ready for review September 25, 2024 19:23
@marcprux
Contributor Author

OK, I think this is ready for review. The resulting SDK artifact works on my local macOS machine, except for a weird error when building tests … I'll make a separate issue/PR for that.

@finagolfin
Owner

> libc++_shared.so seems to no longer be included in /usr/lib. Not sure what changed

Not sure what you mean: I see it in both my Swift 6 single-arch SDKs and in these multi-arch SDKs you're uploading now.

I've started reviewing this pull, will examine the uploaded SDK now and finish my review.

@marcprux
Contributor Author

> > libc++_shared.so seems to no longer be included in /usr/lib. Not sure what changed
>
> Not sure what you mean: I see it in both my Swift 6 single-arch SDKs and in these multi-arch SDKs you're uploading now.

This was just an error on my part: I was copying it to the wrong location.

@finagolfin
Owner

Alright, ran the Swift 6.0.1 compiler validation suite natively on my Android phone and am happy to report no regressions compared to my recent native trunk 6.1 builds, just some tests that have been updated in trunk but are still either misconfigured or mislabeled in the older 6.0 branch.

Next step is to put together the Swift 6.0.1 SDK bundle for Android and test it a bit before releasing. Once the CI can produce a multi-arch SDK bundle, I'll switch it to use those cached CI-prepared multi-arch SDK bundles when there's no new toolchain snapshot to compile first.

I'll take a final look at the contents of this CI-generated multi-arch SDK, suggest some more tweaks, and then we can get this in. I will then use this CI-generated SDK to build an SDK bundle, as discussed in #163, and finally release that.

@finagolfin
Owner

I notice a couple more files in usr/etc/, usr/lib/engines-3/, and usr/lib/ossl-modules/ that were added as part of your Termux library additions: can you remove them if they're not needed? The config file doesn't matter much, but if the shared libraries are needed by the other Termux libraries, they are wrongly available for only a single arch.

@finagolfin
Owner

Time to squash all the commits, I think; a last few tweaks and we can get this in.

@finagolfin
Owner

Alright, other than those last tweaks and your separate rpath pull, this is looking good. Once you make those last few changes, I will merge and the CI will produce the multi-arch SDKs from now on.

No need to add separate commits for each change anymore, mash it all into a final commit now that we're reaching the end.

Thanks for working on this, 😄 hope you don't mind all my tweaks.

@marcprux
Contributor Author

marcprux commented Oct 7, 2024

I've handled the final nits. Ready for squash and merge!

@finagolfin finagolfin merged commit 46e202c into finagolfin:main Oct 7, 2024
3 of 18 checks passed
@finagolfin
Owner

Thanks for all your work on this, Marc! 😃 I will start using this multi-arch SDK to create an SDK bundle locally, then merge those changes back to the CI this week and release.
